Add Avian (https://avian.io) as a new LLM provider with OpenAI-compatible API support, including chat completions, streaming, and function calling.

Models:
- deepseek/deepseek-v3.2 (164K context, $0.26/$0.38 per 1M tokens)
- moonshotai/kimi-k2.5 (131K context, $0.45/$2.20 per 1M tokens)
- z-ai/glm-5 (131K context, $0.30/$2.55 per 1M tokens)
- minimax/minimax-m2.5 (1M context, $0.30/$1.10 per 1M tokens)

Changes:
- Add `AvianIcon` to `components/icons.tsx`
- Add `'avian'` to the `ProviderId` type
- Add the Avian provider definition with model pricing in `models.ts`
- Create `providers/avian/` with `index.ts` and `utils.ts`
- Register the provider in `registry.ts` and `utils.ts`
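The per-model pricing listed above could be modeled along these lines. This is a hypothetical sketch for illustration only; the field names and `estimateCost` helper are assumptions, not the actual `models.ts` schema.

```typescript
// Illustrative representation of the Avian model pricing from this PR.
// Field names are assumptions, not Sim's real models.ts schema.
interface ModelPricing {
  contextWindow: number
  inputPerMillion: number // USD per 1M input tokens
  outputPerMillion: number // USD per 1M output tokens
}

const avianModels: Record<string, ModelPricing> = {
  'deepseek/deepseek-v3.2': { contextWindow: 164_000, inputPerMillion: 0.26, outputPerMillion: 0.38 },
  'moonshotai/kimi-k2.5': { contextWindow: 131_000, inputPerMillion: 0.45, outputPerMillion: 2.2 },
  'z-ai/glm-5': { contextWindow: 131_000, inputPerMillion: 0.3, outputPerMillion: 2.55 },
  'minimax/minimax-m2.5': { contextWindow: 1_000_000, inputPerMillion: 0.3, outputPerMillion: 1.1 },
}

// Compute the USD cost of a request from its token counts.
function estimateCost(model: string, inputTokens: number, outputTokens: number): number {
  const p = avianModels[model]
  return (
    (inputTokens / 1_000_000) * p.inputPerMillion +
    (outputTokens / 1_000_000) * p.outputPerMillion
  )
}
```

For example, 1M input plus 1M output tokens on deepseek/deepseek-v3.2 comes to roughly $0.26 + $0.38 = $0.64.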
@avianion is attempting to deploy a commit to the Sim Team on Vercel. A member of the Team first needs to authorize it.
Greptile Summary

Added Avian as a new LLM provider with OpenAI-compatible API integration. The implementation follows the established DeepSeek provider pattern, reusing proven architecture for OpenAI SDK-based providers. The integration includes proper tool-calling support, streaming responses, cost tracking, and provider metadata registration. All 4 models are correctly configured with pricing and context window specifications.

Confidence Score: 5/5
Flowchart

```mermaid
%%{init: {'theme': 'neutral'}}%%
flowchart TD
    A[User selects Avian provider] --> B[Provider Registry]
    B --> C[avianProvider from /providers/avian/index.ts]
    C --> D[Initialize OpenAI SDK]
    D --> E[baseURL: api.avian.io/v1]
    E --> F{Request Type?}
    F -->|Streaming + No Tools| G[Create stream response]
    F -->|With Tools| H[Tool execution loop]
    F -->|Standard| I[Standard completion]
    H --> J[Execute tools with prepareToolsWithUsageControl]
    J --> K[Track forced tool usage]
    K --> L[Iterate up to MAX_TOOL_ITERATIONS]
    L --> M{Stream final?}
    M -->|Yes| G
    M -->|No| N[Return response with timing]
    G --> O[Calculate costs using model pricing]
    I --> O
    N --> O
    O --> P[Return to user with tokens & cost]
```
Last reviewed commit: f77f80c
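The tool-execution loop that the flowchart describes (iterate up to `MAX_TOOL_ITERATIONS`, feeding tool results back into the conversation until the model produces a final answer) can be sketched roughly as below. This is a simplified, self-contained illustration; the callback signatures and the iteration cap's value are assumptions, not Sim's actual implementation.

```typescript
// Simplified sketch of the tool-execution loop from the flowchart.
// Names and signatures are illustrative, not Sim's real code.
const MAX_TOOL_ITERATIONS = 10

interface ToolCall { name: string; args: unknown }
interface ModelTurn { toolCalls: ToolCall[]; content?: string }

// Calls the model repeatedly, executing any requested tools and feeding
// their results back, until the model returns plain content or the cap is hit.
function runToolLoop(
  callModel: (history: string[]) => ModelTurn,
  executeTool: (call: ToolCall) => string,
): string {
  const history: string[] = []
  for (let i = 0; i < MAX_TOOL_ITERATIONS; i++) {
    const turn = callModel(history)
    if (turn.toolCalls.length === 0) {
      return turn.content ?? '' // model is done: return the final answer
    }
    for (const call of turn.toolCalls) {
      history.push(executeTool(call)) // tool result goes back into context
    }
  }
  return '' // iteration cap reached without a final answer
}
```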
Friendly follow-up — this PR is still active and ready for review. Would appreciate a look when you get a chance! cc @waleedlatif1 @emir-karabeg

Friendly follow-up — this PR is still active and ready for review. All feedback has been addressed. Would appreciate a look when you get a chance! cc @waleedlatif1

Hey @waleedlatif1 @emir-karabeg — friendly follow-up on this PR. Avian is an OpenAI-compatible inference provider that's already live and powering apps like ISEKAI ZERO. This is a lightweight integration (a standard OpenAI-compatible endpoint) and we're happy to address any feedback or make adjustments. Would love to get this merged if you have a moment to review. Thanks!
Summary
Adds Avian as a new LLM provider. Avian is an OpenAI-compatible inference API offering competitive pricing on frontier models.
Models:
- deepseek/deepseek-v3.2 (164K context, $0.26/$0.38 per 1M tokens)
- moonshotai/kimi-k2.5 (131K context, $0.45/$2.20 per 1M tokens)
- z-ai/glm-5 (131K context, $0.30/$2.55 per 1M tokens)
- minimax/minimax-m2.5 (1M context, $0.30/$1.10 per 1M tokens)

Features supported:
- Chat completions
- Streaming responses
- Function/tool calling
- Cost tracking
Changes
- `components/icons.tsx` — Add `AvianIcon` SVG component
- `providers/types.ts` — Add `'avian'` to the `ProviderId` union type
- `providers/models.ts` — Add the Avian provider definition with model pricing and capabilities
- `providers/avian/index.ts` — Provider implementation using the OpenAI SDK (follows the DeepSeek pattern)
- `providers/avian/utils.ts` — Streaming utility using the shared `createOpenAICompatibleStream`
- `providers/registry.ts` — Register `avianProvider` in the provider registry
- `providers/utils.ts` — Add Avian to the provider metadata map

Implementation
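As a rough sketch of what the provider definition and registry wiring from the Changes list might look like: the interface, field names, and registry shape below are assumptions for illustration, not Sim's actual types.

```typescript
// Illustrative provider definition; the real ProviderDefinition type in
// Sim's codebase is assumed, not shown, here.
interface ProviderDefinition {
  id: string
  name: string
  baseURL: string
  models: string[]
}

const avianProvider: ProviderDefinition = {
  id: 'avian',
  name: 'Avian',
  baseURL: 'https://api.avian.io/v1',
  models: [
    'deepseek/deepseek-v3.2',
    'moonshotai/kimi-k2.5',
    'z-ai/glm-5',
    'minimax/minimax-m2.5',
  ],
}

// registry.ts presumably maps provider ids to definitions roughly like this:
const registry: Record<string, ProviderDefinition> = {
  avian: avianProvider,
}
```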
The provider follows the same pattern as other OpenAI-compatible providers (DeepSeek, Cerebras). It uses the `openai` npm package with `baseURL: 'https://api.avian.io/v1'` and supports all standard features, including tool-calling loops, forced tool usage, and streaming with cost tracking. Authentication is via a user-provided API key (
`AVIAN_API_KEY`, sent as a Bearer token).

Test plan
- Tested with the `deepseek/deepseek-v3.2` model

cc @waleedlatif1 @emir-karabeg